Andrew Feldman, Co-Founder and CEO of Cerebras Systems: Interview Series
Andrew is co-founder and CEO of Cerebras Systems. He is an entrepreneur dedicated to pushing boundaries in the compute space. Prior to Cerebras, he co-founded and was CEO of SeaMicro, a pioneer of energy-efficient, high-bandwidth microservers. SeaMicro was acquired by AMD in 2012 for $357M. Before SeaMicro, Andrew was Vice President of Product Management, Marketing, and Business Development at Force10 Networks, which was later sold to Dell Computing for $800M.
Cerebras sets record for largest AI model on a single chip
In brief: US hardware startup Cerebras claims to have trained the largest AI model ever run on a single device, powered by its Wafer Scale Engine 2, the world's largest chip, roughly the size of a dinner plate. "Using the Cerebras Software Platform (CSoft), our customers can easily train state-of-the-art GPT language models (such as GPT-3 and GPT-J) with up to 20 billion parameters on a single CS-2 system," the company claimed this week. "Running on a single CS-2, these models take minutes to set up and users can quickly move between models with just a few keystrokes." The CS-2 packs a whopping 850,000 cores and 40GB of on-chip memory capable of reaching 20 PB/sec of memory bandwidth. The specs of other AI accelerators and GPUs pale in comparison, which means machine learning engineers must split the training of huge, multi-billion-parameter models across many more servers.
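The 20-billion-parameter figure lines up neatly with the 40GB of on-chip memory, which a back-of-envelope check makes clear. This is only a sketch: it assumes 16-bit (2-byte) weights and decimal gigabytes, and the source does not specify the actual numeric format or memory layout CSoft uses.

```python
# Sanity check (assumption: fp16 weights, 2 bytes per parameter):
# do the weights of a 20B-parameter model fit in 40 GB of on-chip memory?
params = 20e9              # 20 billion parameters
bytes_per_param = 2        # fp16 (assumed; not stated in the source)
weight_bytes = params * bytes_per_param

on_chip_bytes = 40e9       # CS-2's 40 GB of on-chip memory (decimal GB)

print(weight_bytes / 1e9)             # 40.0 -> 40 GB of weights
print(weight_bytes <= on_chip_bytes)  # True: the weights just fit
```

Under these assumptions the weights alone consume the full 40GB, which suggests why roughly 20 billion parameters is the stated ceiling for a single CS-2.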
Cerebras Systems Thinks Forward on AI Chips as it Claims Performance Win
Cerebras Systems makes the largest chip in the world, but it is already thinking about its upcoming AI chips as machine learning models continue to grow at breakneck speed. The company's latest Wafer Scale Engine chip is indeed the size of a wafer and is made using TSMC's 7nm process. The next chip will pack in more cores to handle the fast-growing compute needs of AI, said Andrew Feldman, CEO of Cerebras Systems. "In the future it will be five nanometers and it will keep growing. There are always opportunities to improve the performance," Feldman said.
Cerebras brings CS-2 system to data analysis biz nference
AI chip startup Cerebras Systems has deployed one of its CS-2 systems at a well-funded startup that uses natural language processing to analyze massive amounts of biomedical data. As announced on Monday, nference plans to use this CS-2 to train large transformer models designed to extract information from piles of unstructured medical data, providing fresh insights to doctors and improving patient recovery and treatment. The CS-2 is powered by Cerebras' second-generation Wafer-Scale Engine processor, so called because the chip is the size of a full silicon wafer. Cerebras said this deployment marks another significant customer win in the health care and life sciences space, after installing similar systems at pharmaceutical giants GlaxoSmithKline and AstraZeneca as well as the US Department of Energy's Argonne National Laboratory for COVID-19-related research. Andrew Feldman, CEO of Cerebras, told The Register that the installation at Massachusetts-based nference is another testament to Cerebras' belief that its wafer-sized AI chips are better suited than traditional chips, such as Nvidia's GPUs, for analyzing large amounts of data as fast as possible — a capability of growing importance in health care and the life sciences.
Cerebras Systems, G42 to Bring AI Compute Capabilities to the Region
Artificial intelligence (AI) compute solutions provider Cerebras Systems and G42, the UAE-based AI and cloud computing company, have signed a memorandum of understanding (MOU) at GMIS, under which they will bring high-performance AI capabilities to the Middle East. G42, which manages the region's largest cloud computing infrastructure, will upgrade its technology stack with Cerebras' CS-2 systems to deliver AI compute capabilities to its partners and the broader ecosystem. "Cerebras, in partnership with our extraordinary customers, has achieved incredible breakthroughs that are transforming AI," said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. "We are privileged to be working with G42, the Middle East's leader in AI innovation. Together we will transform our industry, making the impossible commonplace."
This massive AI chip has the compute power of a human brain
Cerebras Systems said today that it has achieved the computational equivalent of the human brain, or roughly 100 trillion synapses. Cerebras manufactures what it calls the Wafer Scale Engine-1 and -2, a massive 46,225 sq. mm slab of silicon. The company essentially mounts that chip inside a standalone CS-2 system about the size of a dorm refrigerator. Now, the company says it has been able to surround the CS-2 with several additional technologies that push its brain-scale computational capacity to 120 trillion synapse equivalents, also called parameters. Cerebras isn't alone in trying to model machine learning at the chip level in an effort to duplicate how the human brain works.
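Simple arithmetic shows why supporting 120 trillion parameters requires technologies beyond the chip itself. This is a hedged sketch: it assumes 2-byte (16-bit) weights, which the source does not specify, and compares the resulting weight footprint with the CS-2's 40GB of on-chip memory.

```python
# Rough arithmetic (assumption: fp16 weights, 2 bytes per parameter):
# how much memory would 120 trillion parameters need, and how does that
# compare to the CS-2's 40 GB of on-chip memory?
params = 120e12            # 120 trillion parameters (synapse equivalents)
bytes_per_param = 2        # fp16 (assumed; not stated in the source)
weight_tb = params * bytes_per_param / 1e12   # total weight storage in TB

print(weight_tb)           # 240.0 -> 240 TB of weights

# Fraction of the weights that 40 GB of on-chip memory could hold:
print(40e9 / (params * bytes_per_param))      # ~0.000167, well under 0.1%
```

Under these assumptions the weights total 240 TB, thousands of times more than fits on the chip, so the parameters must live off-chip and stream to the CS-2 as needed — consistent with the article's note that the chip is "surrounded" with additional technologies to reach brain scale.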